Evolving Vision-Based Flying Robots

Authors

  • Jean-Christophe Zufferey
  • Dario Floreano
  • Matthijs van Leeuwen
  • Tancredi Merenda
Abstract

We describe a new experimental approach in which an indoor flying robot evolves the ability to navigate in a textured room using only visual information and neuromorphic control. The architecture of a spiking neural circuit, connected to the vision system and to the motors, is genetically encoded and evolved on the physical robot without human intervention. The flying robot is a small wireless airship equipped with a linear camera and a set of sensors used to measure its performance. Evolved spiking circuits manage to fly the robot around the room by exploiting a combination of visual features, robot morphology, and interaction dynamics.

1 Bio-Inspired Vision for Flying Robots

Our goal is to develop vision-based navigation systems for autonomous miniature (below 80 cm, 50 g) flying robots [1]. Some research teams are working at even smaller scales [2,3,4], but their efforts are concentrated mainly on mechatronics issues. A major challenge for miniature flying robots is the ability to navigate autonomously in complex environments. Conventional distance sensors (laser, ultrasonic) cannot be used on these systems because of their weight. Vision is an attractive sensor modality because it can be lightweight and low-power. However, the mainstream approach to computer vision, based on segmentation, object extraction, and pattern recognition, is not always suitable for small behavioural systems that cannot carry powerful processors and the associated energy sources. An alternative is to take inspiration from the simple circuits and adaptive mechanisms used by living organisms [5]. A pioneering work in this direction was achieved by Franceschini et al. [6], who developed a wheeled robot with a vision system inspired by the visual system of the fly.
The 10 kg synchrodrive robot featured an artificial compound eye with 100 discrete photoreceptors and was able to navigate freely at about 50 cm/s toward a light source while avoiding randomly arranged obstacles. Other successful realisations followed (for a review, see [7]), but unlike the robot by Franceschini et al., where computation was executed onboard by analog electronics, those machines demanded more computing power and were therefore linked to offboard computers for image processing. More recent work uses bio-inspired visual algorithms in flying agents, but these developments have been limited to tethered aircraft [8] and simulated flight [9].

H.H. Bülthoff et al. (Eds.): BMCV 2002, LNCS 2525, pp. 592–600, 2002. © Springer-Verlag Berlin Heidelberg 2002

Fig. 1. Evolutionary vision-based robots. Left: The Khepera robot, equipped with a linear camera (16 pixels, 36° FOV), was positioned in an arena with randomly sized black and white stripes. Random sizes were used to prevent the development of trivial solutions in which the control system would use stripe size to measure distance from the walls and self-motion. The robot was connected to a workstation through rotating contacts that provided serial data transmission and power supply. Right: The blimp-like flying robot, fitted with a similar linear camera (16 pixels, 150° FOV), is enclosed in a 5x5x3 m room with randomly sized black and white stripes on the walls. Serial data transmission is handled by a Bluetooth wireless connection, and power is supplied by an onboard battery.

The control systems of the robots mentioned above were hand-designed. Some authors have proposed evolving vision-based navigation capabilities [10,11]. For example, Huber applied genetic algorithms to simulated 2D agents [12]. Those agents were equipped with only four photoreceptors making up two elementary motion detectors (EMDs), symmetrically placed on each side of the agent.
The parameters of those EMDs, as well as the position and field of view (FOV) of the photoreceptors, were evolved. The best individuals could successfully navigate in a simulated corridor with textured walls and obstacles. The simulation was rather simple, though, especially because inertial forces were not taken into consideration.

In previous work [13], we evolved the architecture of spiking neural networks capable of steering a vision-based wheeled robot. A Khepera robot with a linear camera was asked to navigate in a rectangular arena with textured walls (figure 1, left). The best individuals were capable of moving forward and avoiding walls very reliably. However, the dynamics of this terrestrial robot are much simpler than those of flying agents. In this paper, we extend that approach to a flying robot (figure 1, right) that is expected to navigate within a room using only visual information. Genetic algorithms [14] are used to evolve the architecture of a spiking circuit, which connects low-resolution visual input to the motors of a small indoor airship. Note that other teams are using blimps for studying insect-like vision-based navigation [15,16], but none of them apply the concepts of evolutionary robotics [17].

In the following section, we describe the main challenges of running evolution with real flying systems and give an overview of the experimental setup. Section 3 summarizes the evolutionary and neural dynamics. The results are presented in section 4. Finally, a discussion and future work are given in section 5.

Fig. 2. The blimp features an ellipsoid envelope (100x60x45 cm) filled with helium for a lift capacity of approximately 250 g. Attached on top of and below the envelope are frames made of carbon-fibre rods that support six bumpers (four on top and two below) for collision detection.
It is equipped with three engines (miniature DC motor, gear, propeller): two for horizontal movement (forward, backward, and rotation around the yaw axis) and one for vertical movement. To measure the relative forward airspeed, an anemometer (a free-rotating balsa-wood propeller with a mouse optical encoder) has been mounted on top of the envelope; the system can qualitatively measure airspeeds above 5 cm/s. A distance sensor has been mounted below the gondola, oriented toward the floor, for altitude control in the preliminary experiments.

2 Experimental Setup

Evolving aerial robots brings a new set of challenges. The major issues in developing (evolving, learning) a control system for an airship, with respect to a wheeled robot, are (1) the extension to three dimensions¹, (2) the impossibility of communicating with a computer via cables, (3) the difficulty of defining and measuring performance, and (4) the more complex dynamics. For example, while the Khepera is controlled in speed, the blimp is controlled in thrust (the speed derivative) and can slip sideways. Moreover, inertial and aerodynamic forces play a major role. Artificial evolution is a promising method to automatically develop control systems for complex robots [17], but it requires machines that are capable of moving for long periods of time without human intervention and of withstanding shocks. Those requirements led us to the development of the blimp shown in figure 2.

All onboard electronic components are connected to a Microchip PIC™ microcontroller with a wireless connection to a desktop computer. The bidirectional digital communication with the computer is handled by a Bluetooth™ radio module, allowing a range of more than 15 m. Energy is provided by a Lithium-Ion battery, which lasts more than 3 hours under normal operation during evolutionary runs. For ease of analysis, the evolutionary algorithm and spiking circuits are implemented on the desktop computer, which exchanges sensory data and motor commands with the blimp every 100 ms.² In these experiments, a simple linear camera is attached to the front of the gondola (figure 3), pointing forward. The fish-eye lens gives a horizontal 150° FOV mapped onto 16 photoreceptors (subsampled from about 50 active pixels) whose activations are convolved with a Laplace filter. The Laplace filter detects contrast over three adjacent photoreceptors.

¹ Although the first experiments described hereafter are limited to 2D by the use of a pre-designed altitude regulator.
² An adapted form of the evolutionary algorithm and spiking circuit could be run on the onboard microcontroller [18], but data analysis would be limited.

Fig. 3. Left: The blimp and its main components: the anemometer on top of the envelope, the linear camera pointing forward with a 150° FOV giving a horizontal image of the vertical stripes, the bumpers, and the propellers. Right: Contrast detection is performed by selecting 16 equally spaced photoreceptors and filtering them with a Laplace filter spanning three photoreceptors. Filtered values are then rectified by taking the absolute value and scaling them into the range [0,1]. These values represent the probability of emitting a spike for each corresponding neuron. A linear camera fixed on the gondola is the only source of information for the evolutionary spiking network.

[Fig. 4: network diagram, not recoverable from the text; it shows sensory receptors feeding neurons, with connections carrying the neuron states from the previous time step (t-1), and a motor output.]
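The contrast-detection pipeline described above (subsample 16 photoreceptors, Laplace-filter over three neighbours, rectify, scale into [0,1], then treat each value as a spike probability) can be sketched in Python as follows. This is a minimal illustration, not the authors' implementation: the function names, the 3-tap kernel [-1, 2, -1], the edge padding, and the peak normalisation are our assumptions, since the paper does not give the exact filter coefficients or scaling.

```python
import numpy as np

rng = np.random.default_rng(0)


def spike_probabilities(pixels):
    """Map raw linear-camera pixels to per-neuron spike probabilities.

    Sketch of the Fig. 3 pipeline: subsample 16 equally spaced
    photoreceptors, apply a 3-tap Laplace (contrast) filter, rectify
    with the absolute value, and scale into [0, 1].
    """
    pixels = np.asarray(pixels, dtype=float)
    # Subsample 16 equally spaced photoreceptors from ~50 active pixels.
    idx = np.linspace(0, len(pixels) - 1, 16).round().astype(int)
    photo = pixels[idx]
    # 3-tap Laplace kernel [-1, 2, -1] responds to local contrast
    # and is zero on uniform regions; edge padding avoids spurious
    # responses at the borders (an assumption on our part).
    padded = np.pad(photo, 1, mode="edge")
    lap = np.convolve(padded, [-1.0, 2.0, -1.0], mode="valid")  # 16 values
    # Rectify and scale into [0, 1]: these are the spike probabilities.
    rect = np.abs(lap)
    peak = rect.max()
    return rect / peak if peak > 0 else rect


def emit_spikes(probs):
    """Stochastically emit one spike per input neuron (Bernoulli draw)."""
    return (rng.random(len(probs)) < probs).astype(int)
```

On a uniform image the filter output is zero everywhere (no contrast, no spikes), while a black-to-white edge in the stripes produces probabilities of 1 at the photoreceptors straddling the edge, which then drive the input neurons of the evolved spiking circuit.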



Publication date: 2002